Search results for "Visual control"
Showing 10 of 11 documents
Planning an action.
1997
The motor control of a sequence of two motor acts forming an action was studied in the present experiment. The two analysed motor acts were reaching-grasping an object (first target) and placing it on a second target of the same shape and size (experiment 1). The aim was to determine whether extrinsic properties of the second target (i.e. target distance) could selectively influence the kinematics of reaching and grasping. Distance, position and size of both targets were randomly varied across the experimental session. The kinematics of the initial phase of the first motor act, that is, velocity of reaching and hand shaping of grasping, were influenced by distance of the second target. No k…
On orienting the hand to reach and grasp an object.
1996
Subjects were required to reach and grasp a parallelepiped, the position, orientation and size of which were varied. The kinematics of reaching and grasping movements were studied in full-vision and no-vision conditions. Both direction and movement amplitude of reaching were affected by object orientation. Conversely, both the time course of finger axis orientation and the angular displacement of the hand at the wrist were influenced by object position. These results were not modified by the absence of visual control. Finger aperture during grasping was affected by both object size and orientation. This latter result was not due to a distorted size perception, as shown by a control matching e…
Visual-based Feedback Control of Casting Manipulation
2006
In this paper, we present a method to control casting manipulation by means of real-time visual feedback. Casting manipulation is a technique to deploy a robotic end-effector at large distances from the robot’s base, by throwing the end-effector and controlling its ballistic flight using forces transmitted through a light tether connected to the end-effector itself. The tether cable can also be used to retrieve the end-effector and exert forces on the robot’s environment. Previous work has shown that casting manipulation is able to catch objects at a large distance, proving it viable for applications such as sample acquisition and return, rescue, etc. In previous experiments, the position o…
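The abstract above describes closed-loop control of a thrown, tethered end-effector from real-time camera measurements. The paper's actual controller and vision pipeline are not reproduced here; the following is only a minimal sketch of such a loop, reduced to a one-dimensional point mass with an illustrative PD correction applied through the tether. All names, gains, and the noise model are assumptions for illustration.

```python
# Minimal sketch of a visual-feedback loop for a tethered (casting-style)
# end-effector, reduced to one horizontal axis. All names, gains, and the
# point-mass model are illustrative assumptions, not the paper's method.
import numpy as np

DT = 0.01          # control/vision period [s]
MASS = 0.2         # end-effector mass [kg]
TARGET_X = 3.0     # desired landing position [m]

def vision_measurement(true_x, noise_std=0.002):
    """Stand-in for the camera: position estimate with small measurement noise."""
    return true_x + np.random.normal(0.0, noise_std)

def tether_brake_force(x_est, v_est, kp=2.0, kd=1.0):
    """Proportional-derivative correction applied through the tether.
    The tether can only pull back toward the base (force <= 0)."""
    f = -kp * (x_est - TARGET_X) - kd * v_est
    return min(f, 0.0)

# Simulate a throw: initial forward velocity, then closed-loop braking.
x, v = 0.0, 4.0
prev_est = x
for step in range(300):
    x_est = vision_measurement(x)
    v_est = (x_est - prev_est) / DT      # crude finite-difference velocity
    prev_est = x_est
    f = tether_brake_force(x_est, v_est)
    v += (f / MASS) * DT                 # integrate point-mass dynamics
    x += v * DT
print(f"final position: {x:.3f} m (target {TARGET_X} m)")
```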
Visual Control of a Robotic Hand
2004
The paper deals with the design and implementation of visual control of a robotic system composed of a dexterous hand and stereo cameras. The aim of the proposed system is to reproduce the movements of a human hand in order to learn complex manipulation tasks. A novel algorithm for robust and fast fingertip localization and tracking is presented. Moreover, a simulator is integrated into the system to give useful feedback to users during operation and to provide a robust testing framework for real experiments (see video).
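The entry above hinges on locating fingertips in stereo images and recovering their 3-D positions. The paper's localization algorithm is not given here; the sketch below only shows the standard stereo step that would follow it, linear (DLT) triangulation of one fingertip from two calibrated views. The projection matrices and pixel coordinates are made-up illustrative values.

```python
# Minimal sketch of stereo triangulation for a tracked fingertip, assuming
# two calibrated pinhole cameras with known 3x4 projection matrices.
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one 3-D point from two pixel observations."""
    u1, v1 = uv_left
    u2, v2 = uv_right
    A = np.vstack([
        u1 * P_left[2] - P_left[0],
        v1 * P_left[2] - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # dehomogenise

# Toy calibration: identical intrinsics, right camera shifted 0.1 m along x.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P_left = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

# Project a known fingertip position to generate consistent pixel coordinates.
tip = np.array([0.05, -0.02, 0.5, 1.0])
uv_l = (P_left @ tip)[:2] / (P_left @ tip)[2]
uv_r = (P_right @ tip)[:2] / (P_right @ tip)[2]
print(triangulate(P_left, P_right, uv_l, uv_r))   # ≈ [0.05, -0.02, 0.5]
```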
Direction-dependent activation of the insular cortex during vertical and horizontal hand movements
2016
The planning of any motor action requires a complex multisensory processing by the brain. Gravity - immutable on Earth - has been shown to be a key input to these mechanisms. Seminal fMRI studies performed during visual perception of falling objects and self-motion demonstrated that humans represent the action of gravity in parts of the cortical vestibular system; in particular, the insular cortex and the cerebellum. However, little is known as to whether a specific neural network is engaged when processing non-visual signals relevant to gravity. We asked participants to perform vertical and horizontal hand movements without visual control, while lying in a 3T-MRI sc…
Stereotactic biopsies guided by an optical navigation system: technique and clinical experience.
2002
Frame-based stereotactic biopsies are time-consuming procedures that require head fixation in a ring, explicit coordinate calculation and setting of the parameters. Frameless systems make many of these intermediate steps unnecessary, impose fewer mechanical restrictions on access to the lesions, and with slight modifications can be used to perform stereotactic biopsies. A special adaptation designed to fix the holder and the biopsy instrument is described. The neuronavigation optical tracking system of Radionics was used. CT scans were performed with 6 skin markers. Calibration was performed after head fixation in the Mayfield clamp. Mean calibration error was 2.19 ± 0.81 mm. Th…
Oculovestibular interactions under microgravity.
1993
On a space mission in March 1992, a set of experiments was performed to clarify the interaction between visual, proprioceptive and vestibular inputs to the equilibrium system. Using the VESTA goggle facility from the European Space Agency, we investigated the effect of pure neck receptor stimulation on eye position, as measured by the flash afterimage method, and on perception of a head-fixed luminous line in space. Space vestibular adaptation processes were measured by rotating-pattern perception during prescribed head movements. It was found that static ocular counterrotation does not occur under microgravity conditions. This result suggests that the neck receptors apparently do no…
Transnasal endoscopy for direct visual control of esophageal stent placement without fluoroscopy
2012
Placement of self-expanding metal stents (SEMSs) is a well-established treatment for esophageal stenosis and postoperative anastomotic leaks. Conventional endoscopic procedures for SEMS placement require fluoroscopic guidance, but transnasal endoscopy (TNE) with ultraslim endoscopes may allow precise stent release under direct visual control without the need for fluoroscopy. Prospectively collected data were analysed to investigate the feasibility and safety of TNE-guided SEMS placement without fluoroscopy. Between March 2009 and February 2011, 20 consecutive patients underwent TNE-guided SEMS placement without fluoroscopy. The technical success rate was 100 % and no fluoroscopy was required during th…
AcouMotion – An Interactive Sonification System for Acoustic Motion Control
2006
This paper introduces AcouMotion as a new hard-/software system combining human body motion, tangible interfaces and sonification into a closed-loop human-computer interface that allows non-visual motor control by using sonification (non-speech auditory displays) as the major feedback channel. AcouMotion's main components are (i) a sensor device for measuring motion parameters, (ii) a computer simulation representing the dynamical evolution of a model world, and (iii) a sonification engine which generates an auditory representation of objects and any interactions in the model world. The intended applications of AcouMotion range from new kinds of sport games that can be played without visual di…
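The abstract above outlines a closed loop in which motion data drive a sonification engine. As a rough illustration of such a mapping, not of AcouMotion's actual engine, the sketch below turns a simulated hand position into pitch and stereo pan and writes the result to a WAV file; the trajectory, mapping ranges, and block size are all assumptions.

```python
# Minimal sonification sketch: a motion parameter (simulated hand position)
# is mapped to pitch and stereo pan and rendered block by block.
# Trajectory, mapping ranges, and block size are illustrative assumptions.
import numpy as np
import wave

RATE = 44100
BLOCK = 0.05                      # 50 ms per control update

def hand_position(t):
    """Stand-in for the motion sensor: position swept between -1 and +1."""
    return np.sin(2 * np.pi * 0.5 * t)

def sonify(pos, phase):
    """Map position to pitch (220-880 Hz) and left/right pan, return samples."""
    freq = 220.0 * 2 ** (pos + 1)                  # -1..1 -> 220..880 Hz
    t = np.arange(int(RATE * BLOCK)) / RATE
    tone = 0.3 * np.sin(phase + 2 * np.pi * freq * t)
    pan = (pos + 1) / 2                            # 0 = left, 1 = right
    stereo = np.stack([(1 - pan) * tone, pan * tone], axis=1)
    return stereo, phase + 2 * np.pi * freq * BLOCK

phase = 0.0
blocks = []
for i in range(100):                               # 5 s of audio
    block, phase = sonify(hand_position(i * BLOCK), phase)
    blocks.append(block)

audio = (np.concatenate(blocks) * 32767).astype(np.int16)
with wave.open("acoumotion_sketch.wav", "wb") as w:
    w.setnchannels(2)
    w.setsampwidth(2)
    w.setframerate(RATE)
    w.writeframes(audio.tobytes())
```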
Right-handers and left-handers have different representations of their own hand
1998
The visual control of our own hand when dealing with an object and the observation of interactions between other people's hands and objects can be involved in the construction of internal representations of our own hand, as well as in hand recognition processes. Therefore, a different effect on handedness recognition is expected when subjects are presented with hands holding objects with either a congruent or an incongruent type of grip. Such an experiment was carried out on right-handed and left-handed subjects. We expected that the different degree of lateralisation in motor activities observed in the two populations [J. Herron, Neuropsychology of left-handedness, Academic Press, New York…